DOOM Eternal | Official GeForce RTX 3080 4K Gameplay Video (perf 1:1 compared against 2080 Ti)

Show me Control gameplay with ray tracing between the two.

I don't think that's someone's kid from the office playing 😀

Undying:

Show me Control gameplay with ray tracing between the two.
I don't think that's the idea here. The idea is that many people complained (and I agree) that the jump in true gaming performance (rasterization) from Pascal to Turing was too small to justify the price and the need to upgrade. Now you can see the jump in performance from Turing to Ampere: a card that costs a little more than half the RTX 2080 Ti's price has 50% more performance. At 4K, true, which is still not mainstream gaming, but the point was made...
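
To put rough numbers on that price/performance claim, here is a back-of-the-envelope sketch in Python. It assumes the $699 RTX 3080 MSRP, the commonly cited $1,199 RTX 2080 Ti Founders Edition launch price, and the ~50% uplift from the video; these are illustrative figures, not measured benchmark data.

```python
# Back-of-the-envelope performance-per-dollar comparison.
# Assumed figures: $699 RTX 3080 MSRP, $1,199 RTX 2080 Ti Founders
# Edition launch price, ~50% uplift as shown in the video.
# Illustrative numbers, not measured benchmark data.
msrp = {"RTX 3080": 699, "RTX 2080 Ti": 1199}
perf = {"RTX 3080": 1.5, "RTX 2080 Ti": 1.0}  # 2080 Ti = 1.0 baseline

perf_per_dollar = {card: perf[card] / msrp[card] for card in msrp}
advantage = perf_per_dollar["RTX 3080"] / perf_per_dollar["RTX 2080 Ti"]

print(f"price ratio: {msrp['RTX 3080'] / msrp['RTX 2080 Ti']:.2f}")  # ~0.58
print(f"perf-per-dollar advantage: {advantage:.2f}x")                # ~2.57x
```

Under those assumptions, "a little more than half the price for 50% more performance" works out to roughly 2.6x the performance per dollar.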

Impressive considering Doom doesn't use RT or Tensor cores, so this is true shader performance here.

CPC_RedDawn:

Impressive considering Doom doesn't use RT or Tensor cores, so this is true shader performance here.
Impressive indeed... I think Nvidia has a new "Pascal" here.

CPC_RedDawn:

Impressive considering Doom doesn't use RT or Tensor cores, so this is true shader performance here.
True, but still cherry-picked. Let's see performance in 20 different titles on review day.

I definitely think Doom is a best-case scenario, but I feel like other games won't perform too far off. 35-40% over a 2080 Ti is pretty nice for $700.

I was going to say, where is the RTRT?! So finally a good Vulkan 4K GPU?

Are we just going to ignore the fact that they are using different driver versions?

notthat:

Are we just going to ignore the fact that they are using different driver versions?
I don't see why that would matter. I doubt Nvidia is going to post-launch buff the 2080 Ti at the expense of its newer product and its sales.

notthat:

Are we just going to ignore the fact that they are using different driver versions?
Obviously not. Someone might make an account to point out just that. So you're saying that my 2070S is up for a perf boost just in Doom, or will it be system-wide? 😱

Just to note, this driver is older than the one DF did their video with.

To put it simply: how do we know how much of the performance difference is due to the GPU and how much is due to the newer driver running Doom Eternal better? Why not compare apples to apples and show footage of the 2080 Ti running on the newer driver? It is a laughably easy thing to do, and so obvious that one must wonder why NVidia chose not to. When a company keeps benchmarks a secret, and even the reviewer given an early peek is disallowed from showing actual FPS numbers, you have to wonder. We are watching an ad, that much is clear. But an ad simply saying "Bigger! Better! More performance!" is taken one way, and an ad pretending to quantify something is taken another way. Here we have NVidia pretending to give numbers, but the methodology is flawed because the driver versions are different. No reviewer looking to test a product would change two parameters at the same time when the test can be run with only one parameter changed.
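
To make the "change only one parameter" point concrete, here is a minimal sketch with made-up FPS numbers. The missing control run, the 2080 Ti on the newer driver, is exactly what would let the driver effect and the GPU effect be separated; every value below is hypothetical.

```python
# Hypothetical FPS numbers illustrating why changing the GPU and the
# driver at the same time confounds the comparison. All values are
# made up; only the two "shown" runs exist in the video.
fps = {
    ("RTX 2080 Ti", "old driver"): 80.0,   # shown in the video
    ("RTX 3080",    "new driver"): 120.0,  # shown in the video
    ("RTX 2080 Ti", "new driver"): 88.0,   # the missing control run
}

# Naive uplift from the two runs shown (GPU and driver effects mixed):
naive = fps[("RTX 3080", "new driver")] / fps[("RTX 2080 Ti", "old driver")]

# With the control run, the two effects separate cleanly:
driver_effect = fps[("RTX 2080 Ti", "new driver")] / fps[("RTX 2080 Ti", "old driver")]
gpu_effect = fps[("RTX 3080", "new driver")] / fps[("RTX 2080 Ti", "new driver")]

print(f"naive uplift:  {naive:.2f}x")          # 1.50x (what the ad implies)
print(f"driver alone:  {driver_effect:.2f}x")  # 1.10x
print(f"GPU alone:     {gpu_effect:.2f}x")     # 1.36x (1.10 * 1.36 ~= 1.50)
```

With these made-up numbers, a third of the advertised uplift would come from the driver, which is why a single extra control run settles the question.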

notthat:

To put it simply: how do we know how much of the performance difference is due to the GPU and how much is due to the newer driver running Doom Eternal better? Why not compare apples to apples and show footage of the 2080 Ti running on the newer driver? It is a laughably easy thing to do, and so obvious that one must wonder why NVidia chose not to. When a company keeps benchmarks a secret, and even the reviewer given an early peek is disallowed from showing actual FPS numbers, you have to wonder. We are watching an ad, that much is clear. But an ad simply saying "Bigger! Better! More performance!" is taken one way, and an ad pretending to quantify something is taken another way. Here we have NVidia pretending to give numbers, but the methodology is flawed because the driver versions are different. No reviewer looking to test a product would change two parameters at the same time when the test can be run with only one parameter changed.
Dude, your logic is faulty as hell. The reviews will be out in 2 weeks. Why would they try to embellish something that will ultimately be revealed in 2 weeks? Why would they want to damage their cred by doing that? Think, FFS, think! They have NOTHING to gain at this point by skewing the numbers when in 2 weeks it will all be clear!

Because they are trying to sell a product. It's not like there will be an outcry that NVidia was lying when actual benchmarks and reviews start coming out in 2 weeks, but are they trying to fudge the numbers? Would a company do such a thing in its own marketing material? You be the judge.

They cannot sell a product that isn't available before the reviews. The reviews are what count. There are no buyers stupid enough to ignore reviews and go solely by a company's marketing presentation. Nvidia has more to lose by presenting false info, because they would be ridiculed all over the web for it. Well-run companies have a rep to maintain; this is just too stupid a thing for them to do. Whatever you think they gain from this, they would get their ass bitten for it tenfold. 🙄

notthat:

To put it simply: how do we know how much of the performance difference is due to the GPU and how much is due to the newer driver running Doom Eternal better? Why not compare apples to apples and show footage of the 2080 Ti running on the newer driver? It is a laughably easy thing to do, and so obvious that one must wonder why NVidia chose not to.
Experimental drivers that are only tested to work on the RTX 30 series...?

alanm:

True, but still cherry-picked. Let's see performance in 20 different titles on review day.
Oh, I 100% agree. I'd be more impressed if a 3080 could play RDR2 maxed out at 4K above 60fps. Like you said, we need more benchies though.

Yawn, don't give a rat's ass about 4K. Can't stand 16:9. But it's very nice performance. I'll stick with my 5700 XT. Come on AMD, bring some damn top-tier competition.

Agonist:

Yawn, don't give a rat's ass about 4K. Can't stand 16:9. But it's very nice performance. I'll stick with my 5700 XT. Come on AMD, bring some damn top-tier competition.
Lol.